
    Cold Chain Energy Analysis for Sustainable Food and Beverage Supply

    Perishable goods, such as chilled and frozen foods, have a short shelf life and are highly sensitive to their surrounding environment (e.g., temperature, humidity, and light intensity). For this reason, they must be distributed within a specific time and require special equipment and facilities (e.g., refrigeration and dehumidification systems) throughout the entire chain from farm to fork to slow deterioration and deliver safe, high-quality products to consumers. Cold chains can last for short periods, such as a few hours, or for several months or even years (e.g., frozen food products), depending on the product and the target market. A large amount of energy is required to preserve quality by maintaining the desired temperature level over time. The required energy is also affected by inventory management policies (e.g., warehouse filling levels affect the cooling demand per unit of product) and by the behavior of operators (e.g., the number and duration of door openings). Furthermore, waste entails the loss of the energy and other resources consumed in processing and storing these products. The aim of the present study is to propose a quantitative approach for mapping the energy flows throughout the cold chain in the food and beverage sector and for evaluating its overall energy performance. The resulting energy flow map gives decision-makers insight into the minimum energy required by the cold chain and allows them to prioritize energy efficiency measures by detecting its most energy-consuming stages. Implementing a holistic approach, shifting from a single-company perspective to a chain-wide assessment, leads to a global optimum and to a higher implementation rate of energy efficiency measures, owing to the reduced barriers perceived by the different actors of the cold chain.
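    A minimal, hypothetical sketch of the kind of stage-by-stage energy flow mapping described above: energy use and product throughput per cold-chain stage are aggregated into a specific energy consumption figure (kWh per tonne), and stages are ranked so the most energy-consuming ones surface first. Stage names and numbers are illustrative, not data from the study.

```python
# Illustrative sketch only: aggregating energy use per cold-chain stage and
# ranking stages by specific energy consumption (kWh per tonne of product).
# Stage names and figures are hypothetical, not taken from the study.

stages = {
    "processing":   {"energy_kwh": 120_000, "throughput_t": 900},
    "cold_storage": {"energy_kwh": 310_000, "throughput_t": 900},
    "transport":    {"energy_kwh": 180_000, "throughput_t": 850},
    "retail":       {"energy_kwh": 95_000,  "throughput_t": 820},
}

def specific_energy(stage):
    """kWh consumed per tonne of product handled in a stage."""
    return stage["energy_kwh"] / stage["throughput_t"]

# Rank stages from most to least energy-consuming per unit of product.
ranked = sorted(stages.items(), key=lambda kv: specific_energy(kv[1]), reverse=True)
for name, data in ranked:
    print(f"{name:12s} {specific_energy(data):8.1f} kWh/t")
```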

    Minimax Classification with 0-1 Loss and Performance Guarantees

    Supervised classification techniques use training samples to find classification rules with small expected 0-1 loss. Conventional methods achieve efficient learning and out-of-sample generalization by minimizing surrogate losses over specific families of rules. This paper presents minimax risk classifiers (MRCs), which do not rely on a choice of surrogate loss and family of rules. MRCs achieve efficient learning and out-of-sample generalization by minimizing the worst-case expected 0-1 loss with respect to uncertainty sets that are defined by linear constraints and include the true underlying distribution. In addition, the MRCs’ learning stage provides performance guarantees in the form of tight lower and upper bounds on the expected 0-1 loss. We also present MRCs’ finite-sample generalization bounds in terms of training size and smallest minimax risk, and show their competitive classification performance with respect to state-of-the-art techniques on benchmark datasets. (Ramon y Cajal Grant RYC-2016-1938)
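    A toy illustration of the minimax principle behind MRCs, not the paper's linear-programming formulation: over a small, hand-made uncertainty set of distributions, each candidate rule is scored by its worst-case expected 0-1 loss, and the rule with the smallest worst-case loss is selected. All distributions and rules here are hypothetical.

```python
# Toy illustration of the minimax idea only (not the MRC formulation from the
# paper): pick, from a small set of candidate rules, the one whose worst-case
# expected 0-1 loss over a finite uncertainty set of distributions is smallest.

import itertools

# Feature values x in {0, 1}, labels y in {0, 1}.
# Each candidate distribution assigns a probability to every (x, y) pair.
uncertainty_set = [
    {(0, 0): 0.40, (0, 1): 0.10, (1, 0): 0.10, (1, 1): 0.40},
    {(0, 0): 0.35, (0, 1): 0.15, (1, 0): 0.20, (1, 1): 0.30},
    {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.05, (1, 1): 0.45},
]

# Deterministic rules h: x -> y, enumerated as the pair (h(0), h(1)).
rules = list(itertools.product([0, 1], repeat=2))

def expected_01_loss(rule, dist):
    """Probability of misclassification under one distribution."""
    return sum(p for (x, y), p in dist.items() if rule[x] != y)

def worst_case_loss(rule):
    """Largest expected 0-1 loss over the whole uncertainty set."""
    return max(expected_01_loss(rule, dist) for dist in uncertainty_set)

best = min(rules, key=worst_case_loss)
print("minimax rule (h(0), h(1)):", best, " worst-case loss:", worst_case_loss(best))
```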

    Assessment of Energy Efficiency Measures in Food Cold Supply Chains: A Dairy Industry Case Study

    The quality of human nutrition has improved significantly thanks to the possibility of storing food under suitable temperature conditions. Refrigeration slows chemical and biological degradation and hence reduces the waste of foodstuffs, but at the same time it increases energy consumption. These effects impact the environment and the sustainability performance of the cold chain, and they drive consumers’ choices. The stakeholders of the chain are, therefore, constantly looking for improvement actions that reduce environmental impacts. This paper aims to provide a methodology for prioritizing and assessing energy efficiency measures for cold chains in terms of quality losses and specific energy consumption, distinguishing between technological, maintenance, and managerial opportunities. The analysis is based on the cold supply chain tool developed under the H2020 project ICCEE (“Improving Cold Chain Energy Efficiency”), which follows a holistic approach rather than looking only at the individual stages of the cold chain. Furthermore, an economic evaluation is proposed that considers both cost savings and the required investment.
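    A minimal sketch of the kind of economic screening mentioned above, assuming simple payback (investment divided by annual cost savings) as the ranking criterion; the measures and figures are invented for illustration and are not taken from the case study or the ICCEE tool.

```python
# Hypothetical efficiency measures ranked by simple payback time
# (investment / annual cost savings). All numbers are illustrative.

measures = [
    {"name": "door-seal maintenance",  "investment_eur": 2_000,  "annual_savings_eur": 1_500},
    {"name": "evaporator fan upgrade", "investment_eur": 15_000, "annual_savings_eur": 4_200},
    {"name": "set-point optimization", "investment_eur": 500,    "annual_savings_eur": 900},
]

# Shorter payback first: the cheapest wins per euro saved each year.
for m in sorted(measures, key=lambda m: m["investment_eur"] / m["annual_savings_eur"]):
    payback_years = m["investment_eur"] / m["annual_savings_eur"]
    print(f"{m['name']:24s} payback = {payback_years:4.1f} years")
```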

    Supply chain finance for ameliorating and deteriorating products: a systematic literature review

    Ameliorating and deteriorating products, or, more generally, items that change value over time, are highly sensitive to their surrounding environment (e.g., temperature, humidity, and light intensity). For this reason, they should be properly stored along the supply chain to guarantee the desired quality to consumers. Specifically, ameliorating items gain value if they are stored for longer periods, which can lead to a higher selling price. At the same time, customers’ demand is sensitive to price (i.e., the higher the selling price, the lower the final demand), and this sensitivity is related to the quality of the products (i.e., it is lower for high-quality products). On the contrary, deteriorating items lose quality and value over time, which results in revenue losses due to lost sales or a reduced selling price. Since these products need to be properly stored (usually in temperature- and humidity-controlled warehouses), the holding costs, which also comprise energy costs, may be particularly relevant, affecting the economic, environmental, and social sustainability of the supply chain. Furthermore, due to the recent economic crisis, companies (especially small and medium enterprises) face customers’ payment difficulties and high volatility of resource prices, which increases both the risk of insolvency and the need for financing. In this context, supply chain finance has emerged as a means of improving efficiency by coordinating the financial flow and providing a set of financial schemes aimed at optimizing accounts payable and receivable along the supply chain. The aim of the present study is thus to investigate, through a systematic literature review, the two main themes presented (i.e., inventory management models for products that change value over time, and financial techniques and strategies to support companies in inventory management), to understand whether any financial technique has been studied to support the management of this class of products and to identify the existing literature gap.
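    A toy model, for illustration only, of the value-over-time and price-sensitive-demand ideas discussed above: unit value changes exponentially with storage time (positive rate for ameliorating items, negative for deteriorating ones), demand falls linearly with price, and holding costs accrue per day stored. None of the parameters come from the reviewed literature.

```python
# Illustrative only: a toy profit calculation for an item whose unit value
# changes with storage time and whose demand falls linearly with price.

import math

def profit(days_stored, base_price, rate, holding_cost_per_day,
           base_demand, price_sensitivity):
    price = base_price * math.exp(rate * days_stored)         # value change over time
    demand = max(base_demand - price_sensitivity * price, 0)  # price-sensitive demand
    holding = holding_cost_per_day * days_stored              # includes energy costs
    return demand * price - holding * demand

# Ameliorating item (e.g., an aged product): value grows 1% per day of storage.
print(profit(days_stored=30, base_price=10.0, rate=0.01,
             holding_cost_per_day=0.05, base_demand=100, price_sensitivity=4))

# Deteriorating item: value falls 2% per day of storage.
print(profit(days_stored=30, base_price=10.0, rate=-0.02,
             holding_cost_per_day=0.05, base_demand=100, price_sensitivity=4))
```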

    Searching for dominant high-level features for music information retrieval

    Music Information Retrieval systems are often based on the analysis of a large number of low-level audio features. When dealing with problems of musical genre description and visualization, however, it would be desirable to work with a very limited number of highly informative and discriminant macro-descriptors. In this paper we focus on a specific class of training-based descriptors, which are obtained as the log-likelihood of a Gaussian Mixture Model trained with short musical excerpts that selectively exhibit a certain semantic homogeneity. As these descriptors are critically dependent on the training sets, we approach the problem of how to automatically generate suitable training sets and optimize the associated macro-features in terms of discriminant power and informative impact. We then show the application of a set of three identified macro-features to genre visualization, tracking, and classification.
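    A sketch of the macro-descriptor idea under stated assumptions: a Gaussian Mixture Model is fitted on low-level feature frames taken from semantically homogeneous excerpts, and the average log-likelihood of a new excerpt under that model is used as a single high-level feature. Random data stands in for real audio features (e.g., MFCCs); the model size and settings are arbitrary choices, not those of the paper.

```python
# Train a GMM on low-level feature frames from a homogeneous training set and
# use the mean log-likelihood of a new excerpt as one scalar macro-feature.

import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Hypothetical training frames: 2000 frames x 13 low-level features (MFCC-like).
training_frames = rng.normal(size=(2000, 13))

gmm = GaussianMixture(n_components=8, covariance_type="diag", random_state=0)
gmm.fit(training_frames)

# Frames extracted from a new excerpt to be described.
excerpt_frames = rng.normal(size=(400, 13))

# score() returns the mean per-frame log-likelihood: one scalar macro-feature.
macro_feature = gmm.score(excerpt_frames)
print("log-likelihood macro-descriptor:", macro_feature)
```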

    Detection of Earth-like Planets Using Apodized Telescopes

    The mission of NASA's Terrestrial Planet Finder (TPF) is to find Earth-like planets orbiting other stars and to characterize the atmospheres of these planets using spectroscopy. Because of the enormous brightness ratio between the star and the light reflected from the planet, techniques must be found to reduce the brightness of the star. The currently favored approach is interferometry: interfering the light from two or more separated telescopes with a π phase shift, thereby nulling out the starlight. While this technique can, in principle, achieve the required dynamic range, building a space interferometer with the necessary characteristics poses immense technical difficulties. In this paper, we suggest a much simpler approach to achieving the required dynamic range. By simply adjusting the transmissive shape of a telescope aperture, the intensity in large regions around the stellar image can be reduced nearly to zero. This approach could lead to the construction of a TPF using conventional technologies, requiring space optics on a much smaller scale than the current TPF approach. (Comment: Accepted for publication in ApJ Letters, 9 pages, 6 figures)
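    A rough numerical sketch of the apodization idea, not the aperture profiles proposed in the paper: the point-spread function of a hard-edged circular aperture is compared with that of the same aperture under a smooth Gaussian transmission taper, using a plain FFT. Grid size and taper width are arbitrary illustrative choices.

```python
# Compare the diffraction pattern of a uniform circular aperture with that of
# the same aperture whose transmission is smoothly tapered (apodized).

import numpy as np

n = 512
x = np.linspace(-1, 1, n)
xx, yy = np.meshgrid(x, x)
r = np.hypot(xx, yy)

uniform_aperture = (r <= 0.5).astype(float)
apodized_aperture = uniform_aperture * np.exp(-(r / 0.25) ** 2)  # Gaussian taper

def psf(aperture):
    """Normalized far-field intensity (point-spread function) of an aperture."""
    field = np.fft.fftshift(np.fft.fft2(aperture))
    image = np.abs(field) ** 2
    return image / image.max()

psf_uniform, psf_apodized = psf(uniform_aperture), psf(apodized_aperture)

# Compare intensity levels away from the core of the stellar image.
ring = r > 0.2
print("max off-axis intensity, uniform :", psf_uniform[ring].max())
print("max off-axis intensity, apodized:", psf_apodized[ring].max())
```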

    A Data-Driven Model of Tonal Chord Sequence Complexity


    Frequency-domain nonlinear modeling approaches for power systems components - A comparison

    Harmonic simulations play a key role in studying and predicting the impact of nonlinear devices on the power quality of distribution grids. A frequency-domain approach allows higher computational efficiency, which is of key importance when complex networks have to be studied. However, it requires proper frequency-domain behavioral models able to represent the nonlinear voltage-current relationship characterizing these devices. The Frequency Transfer Matrix (FTM) method is one of the most widespread frequency-domain modeling approaches for power system applications. However, other suitable techniques have been developed in recent years, in particular the X-parameters approach, which comes from radio-frequency and microwave applications, and the simplified Volterra models under quasi-sinusoidal conditions, which have been specifically tailored to power system devices. In this paper the FTM, X-parameters, and simplified Volterra approaches are compared in representing the nonlinear voltage-current relationship of a bridge rectifier feeding an ohmic-capacitive dc load. Results show that the X-parameters model reaches good accuracy, slightly better than that achieved by the FTM and simplified Volterra models, but with a considerably larger set of coefficients. The simplified Volterra models under quasi-sinusoidal conditions allow an effective trade-off between accuracy and complexity.
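    A toy sketch of the frequency-coupling idea that these behavioral models share, written in an FTM-like form: harmonic current phasors are expressed as a nominal operating point plus a linear map of the voltage-harmonic perturbations. The coupling matrix and phasor values below are invented for illustration and do not describe the rectifier studied in the paper.

```python
# FTM-style behavioral relation around an operating point: I = I0 + Y @ dV,
# where entries couple voltage harmonics to current harmonics. Values are
# made up for illustration.

import numpy as np

harmonics = [1, 3, 5]  # harmonic orders considered

# Current phasors at the nominal operating point (A, complex).
I0 = np.array([10.0 + 0.0j, 2.5 - 0.8j, 1.1 + 0.4j])

# Coupling (admittance-like) matrix: effect of a small change in the voltage
# phasor at harmonic k on the current phasor at harmonic h.
Y = np.array([
    [0.50 + 0.05j, 0.02 - 0.01j, 0.01 + 0.00j],
    [0.03 + 0.01j, 0.20 - 0.02j, 0.02 + 0.01j],
    [0.01 + 0.00j, 0.03 + 0.01j, 0.10 - 0.01j],
])

# Perturbation of the supply-voltage harmonics w.r.t. the nominal waveform (V).
dV = np.array([2.0 + 0.0j, 0.5 + 0.2j, 0.0 + 0.1j])

I = I0 + Y @ dV
for h, i_h in zip(harmonics, I):
    print(f"harmonic {h}: |I| = {abs(i_h):.2f} A, angle = {np.angle(i_h, deg=True):.1f} deg")
```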